Faster tensor train decomposition for sparse data

Authors

Abstract

In recent years, the application of tensors has become more widespread in fields that involve data analytics and numerical computation. Due to the explosive growth of data, low-rank tensor decompositions have become a powerful tool to harness the notorious curse of dimensionality. The main forms of decomposition include CP decomposition, Tucker decomposition, tensor train (TT) decomposition, etc. Each of the existing TT decomposition algorithms, including TT-SVD and randomized TT-SVD, is successful in its own setting, but neither can decompose large-scale sparse tensors both accurately and efficiently. Building on previous research, this paper proposes a new quasi-optimal fast TT decomposition algorithm for sparse data, with proven correctness and a derived upper bound on its computational complexity. On demand, it can also produce a decomposition with no error at the cost of slightly larger TT-ranks. In numerical experiments, we verify that the proposed algorithm runs much faster than TT-SVD and randomized TT-SVD, and has advantages in speed, precision, and versatility over TT-cross. Moreover, it realizes decompositions of sparse matrices that were previously unachievable, enabling TT-based algorithms to be applied in a wider range of scenarios.
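The abstract contrasts the proposed method with the classical TT-SVD baseline. For orientation, a dense TT-SVD can be sketched in a few lines of NumPy; the function name tt_svd and the tolerance eps below are illustrative assumptions and do not reproduce the paper's algorithm, whose point is precisely to avoid this dense computation for sparse data.

```python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Sketch of the classical TT-SVD: sequentially reshape the tensor and
    truncate an SVD at each step to obtain the TT cores.
    Dense only -- its cost scales with the total number of entries."""
    dims = tensor.shape
    d = len(dims)
    # Spread the allowed relative error over the d-1 truncated SVDs.
    delta = eps * np.linalg.norm(tensor) / np.sqrt(d - 1) if d > 1 else 0.0
    cores, r_prev = [], 1
    unfolding = tensor.reshape(dims[0], -1)
    for k in range(d - 1):
        unfolding = unfolding.reshape(r_prev * dims[k], -1)
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        # Smallest rank whose discarded singular-value tail stays below delta.
        tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]
        r = max(1, int(np.sum(tail > delta)))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        unfolding = np.diag(s[:r]) @ Vt[:r]
        r_prev = r
    cores.append(unfolding.reshape(r_prev, dims[-1], 1))
    return cores
```

Because every step works on a dense unfolding, the cost grows with the total number of tensor entries, which is exactly what makes TT-SVD impractical for the large sparse tensors targeted by the paper.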


Similar articles

High-order Tensor Completion for Data Recovery via Sparse Tensor-train Optimization

In this paper, we aim at the problem of tensor data completion. Tensor-train decomposition is adopted because of its powerful representation ability and linear scalability to tensor order. We propose an algorithm named Sparse Tensor-train Optimization (STTO) which considers the incomplete data as a sparse tensor and uses a first-order optimization method to find the factors of the tensor-train decomposition...
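As a rough illustration of the sparse-tensor view described above, the sketch below performs one first-order (gradient) sweep over the observed entries of a tensor held in TT format. The function name completion_sweep, the learning rate lr, and the data layout are assumptions for illustration; this is not the STTO implementation.

```python
import numpy as np

def completion_sweep(cores, indices, values, lr=1e-2):
    """One first-order sweep over the observed entries of an incomplete
    tensor stored as (multi-index, value) pairs -- the sparse-tensor view
    of completion. Each TT core has shape (r_{k-1}, n_k, r_k)."""
    d = len(cores)
    for idx, y in zip(indices, values):
        # Core slices at this multi-index, shapes (r_{k-1}, r_k).
        slices = [cores[k][:, idx[k], :] for k in range(d)]
        # Left products L[k] (1 x r_{k-1}) and right products R[k] (r_k x 1).
        L = [np.ones((1, 1))]
        for k in range(d - 1):
            L.append(L[-1] @ slices[k])
        R = [None] * d
        R[d - 1] = np.ones((1, 1))
        for k in range(d - 2, -1, -1):
            R[k] = slices[k + 1] @ R[k + 1]
        # Predicted entry and residual of the squared-error loss.
        pred = (L[d - 1] @ slices[d - 1] @ R[d - 1]).item()
        resid = pred - y
        # d(0.5 * resid**2) / d(core slice k) = resid * L[k].T @ R[k].T
        for k in range(d):
            cores[k][:, idx[k], :] -= lr * resid * (L[k].T @ R[k].T)
    return cores
```

Each update touches only the core slices indexed by an observed entry, so the work per sweep scales with the number of observations rather than with the full tensor size.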


A Randomized Tensor Train Singular Value Decomposition

The hierarchical SVD provides a quasi-best low rank approximation of high dimensional data in the hierarchical Tucker framework. Similar to the SVD for matrices, it provides a fundamental but expensive tool for tensor computations. In the present work we examine generalizations of randomized matrix decomposition methods to higher order tensors in the framework of the hierarchical tensors repres...
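One common way to carry randomized matrix methods over to tensor formats is to replace the exact SVD of each unfolding with a Gaussian range finder followed by a QR factorization. The sketch below does this for the TT (rather than hierarchical Tucker) format; the names randomized_tt, ranks, and oversample are illustrative assumptions, not the authors' interface.

```python
import numpy as np

def randomized_tt(tensor, ranks, oversample=5, rng=None):
    """Sketch of a randomized TT decomposition: at each unfolding, a Gaussian
    sketch plus QR replaces the exact SVD. `ranks` lists the target TT-ranks
    r_1, ..., r_{d-1}."""
    rng = np.random.default_rng() if rng is None else rng
    dims = tensor.shape
    d = len(dims)
    cores, r_prev = [], 1
    C = tensor.reshape(dims[0], -1)
    for k in range(d - 1):
        C = C.reshape(r_prev * dims[k], -1)
        r = min(ranks[k], min(C.shape))
        # Randomized range finder: sample the column space of C.
        Y = C @ rng.standard_normal((C.shape[1], r + oversample))
        Q, _ = np.linalg.qr(Y)
        Q = Q[:, :r]
        cores.append(Q.reshape(r_prev, dims[k], r))
        C = Q.T @ C          # project the remainder onto the found basis
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores
```

Oversampling by a handful of extra columns is the usual safeguard for such range finders; larger values trade speed for accuracy.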


Tensor Train decomposition on TensorFlow (T3F)

Tensor Train decomposition is used across many branches of machine learning, but until now it lacked an implementation with GPU support, batch processing, automatic differentiation, and versatile functionality for the Riemannian optimization framework, which takes into account the underlying manifold structure in order to construct efficient optimization methods. In this work, we propose a library th...


Completion of High Order Tensor Data with Missing Entries via Tensor-Train Decomposition

In this paper, we aim at the completion problem of high order tensor data with missing entries. The existing tensor factorization and completion methods suffer from the curse of dimensionality when the order of the tensor N >> 3. To overcome this problem, we propose an efficient algorithm called TT-WOPT (Tensor-train Weighted OPTimization) to find the latent core tensors of tensor data and recover ...
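A minimal sketch of the weighted objective suggested by this abstract, assuming NumPy and illustrative helper names tt_to_full and weighted_loss: only the entries flagged by a binary mask contribute to the squared error between the data and the tensor reconstructed from its TT cores.

```python
import numpy as np

def tt_to_full(cores):
    """Contract TT cores, each of shape (r_{k-1}, n_k, r_k), back into a
    dense tensor of shape (n_1, ..., n_d)."""
    full = cores[0]                          # shape (1, n_1, r_1)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.reshape(full.shape[1:-1])    # drop the boundary ranks of 1

def weighted_loss(cores, data, mask):
    """0.5 * || mask * (data - TT reconstruction) ||_F^2 -- the weighted
    completion objective; missing entries (mask == 0) are ignored."""
    resid = mask * (data - tt_to_full(cores))
    return 0.5 * np.sum(resid ** 2)
```

TT-WOPT-style methods minimize this kind of objective with gradients taken with respect to the cores; the dense reconstruction here is only for readability.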


Sparse and Low-Rank Tensor Decomposition

Motivated by the problem of robust factorization of a low-rank tensor, we study the question of sparse and low-rank tensor decomposition. We present an efficient computational algorithm that modifies Leurgans’ algorithm for tensor factorization. Our method relies on a reduction of the problem to sparse and low-rank matrix decomposition via the notion of tensor contraction. We use well-understoo...
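The tensor contraction behind such reductions is a one-liner: contracting a third-order tensor with a weight vector along its last mode gives a matrix, and a rank-R tensor contracts to a matrix of rank at most R, so low-rank structure survives the reduction. A minimal sketch, assuming NumPy and a hypothetical function name mode3_contraction:

```python
import numpy as np

def mode3_contraction(tensor, weights):
    """T_a[i, j] = sum_k weights[k] * tensor[i, j, k]: the contraction that
    turns a third-order tensor problem into a matrix one. The downstream
    sparse-plus-low-rank matrix decomposition is not reproduced here."""
    return np.tensordot(tensor, weights, axes=([2], [0]))
```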



Journal

Journal title: Journal of Computational and Applied Mathematics

Year: 2022

ISSN: 0377-0427, 1879-1778, 0771-050X

DOI: https://doi.org/10.1016/j.cam.2021.113972